hadoop fs: has the widest scope and can operate on any file system, not just HDFS. hadoop dfs and hdfs dfs: operate only on HDFS (including operations that touch the local FS); hadoop dfs is already deprecated, so the latter is typically used. The following reference is from Stack Overflow: the three commands appear the same but have minute differences.
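For example (the namenode host and port below are placeholders), the same HDFS path can be listed through either interface, while only hadoop fs also accepts non-HDFS URIs such as file://:
hadoop fs -ls hdfs://namenode:9000/user/hadoop
hdfs dfs -ls /user/hadoop
hadoop fs -ls file:///tmp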
Hadoop Shell commands
Invoke them as bin/hadoop fs <args>.
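To list every available fs subcommand with a short description, you can use the built-in help (no assumptions beyond a working installation):
bin/hadoop fs -help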
1. cat
Description: outputs the content of the file at the specified path to stdout.
Usage: hadoop fs -cat URI [URI ...]
Example:
hadoop fs -cat hdfs://host1:port1/file1 hdfs://host2:port2/file2
hadoop fs -cat file:///file3 /user/hadoop/
Running a JAR job: hadoop jar <jarFile> [jobMainClass] [jobArgs]
Killing a running job:
hadoop job -kill job_20100531_37_0053
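If you do not know the job ID, you can list running jobs first; a minimal sketch using the same (now deprecated) job command the example above relies on:
hadoop job -list
hadoop job -kill <job_id>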
More Hadoop commands
Run hadoop without any arguments to see descriptions of the remaining commands:
namenode -format     format the DFS filesystem
secondarynamenode    run the DFS secondary namenode
namenode             run the DFS namenode
datanode             run a DFS datanode
After installing a Hadoop pseudo-distributed environment, executing the relevant commands (for example: bin/hdfs dfs -ls) prints WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable. This happens because the installed native packages do not match the platform.
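To see exactly which native libraries were found, newer Hadoop releases (2.x and later) ship a checknative tool; a quick diagnostic, assuming your version includes it:
hadoop checknative -a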
addnl is optional and is used to specify that a newline character be added at the end of each file (see the getmerge sketch below).
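A minimal getmerge sketch (the HDFS directory and local file name here are hypothetical): concatenate every file under /user/hadoop/output into one local file, appending a newline after each part:
hadoop fs -getmerge /user/hadoop/output merged.txt addnl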
ls
Usage: hadoop fs -ls <args>
If it is a file, the file information is returned in the following format:
filename <number of replicas> file size modification date modification time permissions user ID group ID
If it is a directory, it returns a list of its immediate children, as in Unix. The information in the directory listing is as follows:
directory name <dir> modification date modification time permissions user ID group ID
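For illustration (the file and its attributes below are hypothetical, and the exact layout varies by Hadoop version), listing a directory might look like:
hadoop fs -ls /user/admin/aaron
Found 1 items
-rw-r--r--   3 admin supergroup       1366 2010-05-31 10:00 /user/admin/aaron/file1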
This article provides a detailed analysis of some commonly used commands in Hadoop. Assume that the Hadoop installation directory HADOOP_HOME is /home/admin/hadoop.
Start and stop
Start Hadoop:
1. Go to the HADOOP_HOME directory.
2. Execute sh bin/start-all.sh
Stop Hadoop:
1. Go to the HADOOP_HOME directory.
2. Execute sh bin/stop-all.sh
File operations
Hadoop uses HDFS, which provides functionality similar to the disk systems we use. Wildcard characters such as * are also supported.
View the file list
View the files in the /user/admin/aaron directory in HDFS:
1. Go to the HADOOP_HOME directory.
2. Execute sh bin/hadoop fs -ls /user/admin/aaron
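Putting the steps above together, a typical session under the stated HADOOP_HOME (paths taken from the text; output omitted) would be:
cd /home/admin/hadoop
sh bin/start-all.sh
sh bin/hadoop fs -ls /user/admin/aaron
sh bin/stop-all.sh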
Compact a single column family within a region:
hbase> major_compact 'r1', 'c1'
Compact a single column family within a table:
hbase> major_compact 't1', 'c1'
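For comparison, the HBase shell can also major-compact a whole table or an entire region in one call (the table and region names are the same placeholders as above):
hbase> major_compact 't1'
hbase> major_compact 'r1'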
Configuration management and node restart
1) Modify the HDFS configuration
HDFS configuration location: /etc/hadoop/conf
# Sync the HDFS configuration to all slaves
cat /home/hadoop/slaves | xargs -i -t scp /etc/hadoop/conf/hdfs-site.xml {}:/etc/hadoop/conf/
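An equivalent, perhaps more readable sketch of the same sync (assumes the slaves file lists one hostname per line and that passwordless SSH is set up):
for host in $(cat /home/hadoop/slaves); do
  scp /etc/hadoop/conf/hdfs-site.xml "$host":/etc/hadoop/conf/
done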
Preface: well, it is certainly more comfortable not having to write code, but we can't slack off. The files that Hive operates on need to be loaded into HDFS first. The commands are similar to Linux commands; each command line begins with hadoop fs - (dash). For example, ls lists a file or directory, and cat prints a file: hadoop fs -cat ./hello.txt
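Once a file is in HDFS, loading it into Hive is a single statement; a minimal sketch (the table name mytable is hypothetical):
hive> LOAD DATA INPATH '/user/hadoop/hello.txt' INTO TABLE mytable;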
hadoop fs -cat '/ci_cuser_20141231141853691/*' > ci_cusere_20141231141853691.csv
echo $?
~/.bash_profile: each user can use this file to define shell settings for their own use; when the user logs in, the file is executed only once! By default, it sets some environment variables and then executes the user's .bashrc file.
hadoop fs -cat "$1$2/*" > $3.csv
mv $3.csv /home/ocdc/coc
String command = "cd " + ciFtpInfo.getFtpPath() + " " + hadoopPath + "hadoop fs -cat '/user
A. Common Hadoop commands
1. The fs commands of Hadoop
# View all of Hadoop's fs commands
hadoop fs
# Upload files (both put and copyFromLocal are upload commands)
hadoop fs -put jdk-7u55-linux-i586.tar.gz hdfs://hucc01:9000/jdk
hadoop fs -copyFromLocal jdk-7u55-linux-i586.tar.gz hdfs://hucc01:9000/jdk
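The download direction mirrors the upload commands (both get and copyToLocal work; the paths reuse the example above):
hadoop fs -get hdfs://hucc01:9000/jdk jdk-7u55-linux-i586.tar.gz
hadoop fs -copyToLocal hdfs://hucc01:9000/jdk jdk-7u55-linux-i586.tar.gz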
Log in to MySQL: mysql -h172.16.77.15 -uroot -p123 (mysql -h <host address> -u <user name> -p <user password>).
View character sets: show variables like '%char%';
To modify the character set: vi /etc/my.cnf and add default-character-set=utf8 under [client].
Set up passwordless sudo: to give the aboutyun user passwordless sudo permission:
chmod u+w /etc/sudoers
aboutyun ALL=(root) NOPASSWD:ALL
chmod u-w /etc/sudoers
Test: sudo ifconfig
Ubuntu, view the service list: sudo service --status-all
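The my.cnf change described above amounts to this snippet (file location /etc/my.cnf as stated; utf8 is the character set the text chooses):
[client]
default-character-set=utf8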